What is Foopipes

Foopipes is a lightweight integration engine for moving JSON and binary data from external services, transforming it, and storing it in local or external data stores.

Foopipes is built on an event-driven, workflow-like architecture using messaging.

Data flows are orchestrated either with configuration files or with Node.js JavaScript modules.
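Purely as an illustration (Foopipes' actual configuration schema is not shown in this document, so every key and value below is a hypothetical sketch), a configuration-file pipeline definition might resemble:

```json
{
  "pipelines": [
    {
      "name": "import-articles",
      "source": { "type": "http", "url": "https://example.com/api/articles" },
      "transform": "transform.js",
      "destination": { "type": "search-index", "index": "articles" }
    }
  ]
}
```

The idea is that a declarative file wires a source, a transformation module, and a destination together without custom orchestration code.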

Data transformations are accomplished with user-written JavaScript modules running in Node.js.
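As a minimal sketch of such a module (the export shape, function name, and document fields are assumptions, not Foopipes' documented contract), a transformation could take an incoming JSON document and return an enriched copy:

```javascript
// Hypothetical Node.js transformation module. The contract assumed here
// (a plain function receiving one JSON document and returning the
// transformed result) is an illustration only.
function transform(doc) {
  return {
    ...doc,
    // Derive a URL-friendly slug before the document is stored or indexed.
    slug: String(doc.title || '').toLowerCase().replace(/\s+/g, '-'),
  };
}

module.exports = transform;

// Example:
// transform({ title: 'Hello World' })
//   → { title: 'Hello World', slug: 'hello-world' }
```

Keeping transformations as pure functions like this makes them easy to unit-test outside the engine.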

Foopipes is scalable, distributed and robust.

The functionality of Foopipes can be extended with third party plug-ins.

One of Foopipes' design goals is that it can be set up quickly for common usage scenarios.

Foopipes runs in Docker or stand-alone.

Purpose of Foopipes

  • Remove hard dependencies on external systems.
  • Create an abstraction layer between services.
  • Break up monoliths when moving to a microservice architecture.
  • Transform, restructure, and enrich data before consumption.
  • Combine data from multiple data sources.
  • Build a free-text search index over data from multiple data sources.

Problem Definition

Modern websites often have many dependencies on external systems, either on the internet or inside an organisation. Often these systems lack the functionality, SLA, or performance to withstand the demands of business-critical consumers. Thus, the saying "when it blows, the leaves shake" is a reality that must be mitigated.

Also, the evolution of a website is often done in phases. When that is the case, it is preferable to break the system up into smaller pieces, where each piece can have its own life cycle, its own demands, and its own product owner. Still, much of the content administration should be done in the same interface. The era of monoliths is coming to an end.

Vision

  • Data is more distributed than ever. Depending on many external sources makes it nearly impossible to build a robust system.
  • Multiple data consumers make schema changes a challenge.
  • Technology is moving faster than ever. Abstraction is a necessity.
  • Aggregated data can provide more value than each dataset on its own.